Decomposing word embedding with the capsule network
Authors
Abstract
Word sense disambiguation tries to learn the appropriate sense of an ambiguous word in a given context. Existing pre-trained language methods and methods based on multi-embeddings do not sufficiently exploit the power of the unsupervised word embedding. In this paper, we discuss a capsule network-based approach, taking advantage of the capsule's potential for recognizing highly overlapping features and dealing with segmentation. We propose a method to decompose the unsupervised word embedding of an ambiguous word into a context-specific sense embedding, called CapsDecE2S. In this approach, the unsupervised word embedding is fed into a capsule network to produce multiple morpheme-like vectors, which are defined as the basic semantic units of meaning. With attention operations, CapsDecE2S integrates the word context to reconstruct the morpheme-like vectors into the context-specific sense embedding. To train CapsDecE2S, we propose a sense-matching training method. In this method, we convert sense learning into a binary classification that explicitly learns the relation between senses through the labels matching and non-matching. CapsDecE2S was experimentally evaluated on two sense-learning tasks, i.e., word in context and word sense disambiguation. Results on two public corpora, Word-in-Context and English all-words Word Sense Disambiguation, show that the CapsDecE2S model achieves a new state-of-the-art on both tasks. The source code can be downloaded from the GitHub page.
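The decompose-then-recombine idea in the abstract can be illustrated with a minimal, hypothetical sketch (not the authors' implementation): per-capsule linear maps stand in for the capsule network that splits one word embedding into K morpheme-like vectors, and a softmax attention driven by a context vector recombines them into a context-specific sense embedding. All names, sizes, and the projection scheme here are illustrative assumptions.

```python
# Illustrative sketch of the CapsDecE2S idea, NOT the paper's code:
# decompose one word embedding into K "morpheme-like" vectors, then
# recombine them with context-driven attention weights.
import math
import random

random.seed(0)
D, K = 6, 3  # embedding size and number of morpheme-like vectors (illustrative)

# Hypothetical per-capsule projection matrices standing in for the capsule layer.
W = [[[random.gauss(0, 0.1) for _ in range(D)] for _ in range(D)] for _ in range(K)]

def matvec(m, v):
    """Multiply a D x D matrix by a length-D vector."""
    return [sum(m[i][j] * v[j] for j in range(D)) for i in range(D)]

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def decompose(word_vec):
    """Produce K morpheme-like vectors from one word embedding."""
    return [matvec(W[k], word_vec) for k in range(K)]

def recombine(morphemes, context_vec):
    """Softmax attention over the morpheme-like vectors, driven by context."""
    scores = [dot(m, context_vec) for m in morphemes]
    mx = max(scores)
    exps = [math.exp(s - mx) for s in scores]
    z = sum(exps)
    weights = [e / z for e in exps]
    # Weighted sum yields the context-specific sense embedding.
    return [sum(w * m[i] for w, m in zip(weights, morphemes)) for i in range(D)]

word = [random.gauss(0, 1) for _ in range(D)]
context = [random.gauss(0, 1) for _ in range(D)]
sense_vec = recombine(decompose(word), context)
```

In the paper the decomposition and attention are learned end-to-end with the sense-matching objective; this sketch only shows the data flow from one embedding to a context-conditioned sense vector.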
Similar Resources
Recurrent Neural Network with Word Embedding for Complaint Classification
Complaint classification aims at using information to deliver greater insights to enhance user experience after purchasing the products or services. Categorized information can help us quickly collect emerging problems in order to provide a support needed. Indeed, the response to the complaint without the delay will grant users highest satisfaction. In this paper, we aim to deliver a novel appr...
Chinese Event Extraction Using Deep Neural Network with Word Embedding
A lot of prior work on event extraction has exploited a variety of features to represent events. Such methods have several drawbacks: 1) the features are often specific to a particular domain and do not generalize well; 2) the features are derived from various linguistic analyses and are error-prone; and 3) some features may be expensive and require a domain expert. In this paper, we develop a C...
Embedding a Semantic Network in a Word Space
We present a framework for using continuousspace vector representations of word meaning to derive new vectors representing the meaning of senses listed in a semantic network. It is a post-processing approach that can be applied to several types of word vector representations. It uses two ideas: first, that vectors for polysemous words can be decomposed into a convex combination of sense vectors...
Bayesian Neural Word Embedding
Recently, several works in the domain of natural language processing presented successful methods for word embedding. Among them, the Skip-Gram (SG) with negative sampling, known also as word2vec, advanced the state-of-the-art of various linguistics tasks. In this paper, we propose a scalable Bayesian neural word embedding algorithm that can be beneficial to general item similarity tasks as well...
Embedding Word Similarity with Neural Machine Translation
Neural language models learn word representations, or embeddings, that capture rich linguistic and conceptual information. Here we investigate the embeddings learned by neural machine translation models, a recently-developed class of neural language model. We show that embeddings from translation models outperform those learned by monolingual models at tasks that require knowledge of both conce...
Journal
Journal Title: Knowledge-Based Systems
Year: 2021
ISSN: 1872-7409, 0950-7051
DOI: https://doi.org/10.1016/j.knosys.2020.106611